Geometric Descent Method for Convex Composite Minimization
Shixiang Chen, Shiqian Ma, Wei Liu
In this paper, we extend the geometric descent method recently proposed by Bubeck, Lee and Singh to tackle nonsmooth and strongly convex composite problems. We prove that our proposed algorithm, dubbed geometric proximal gradient method (GeoPG), converges with a linear rate $(1-1/\sqrt{\kappa})$ and thus achieves the optimal rate among first-order methods, where $\kappa$ is the condition number of the problem. Numerical results on linear regression and logistic regression with elastic net regularization show that GeoPG compares favorably with Nesterov's accelerated proximal gradient method, especially when the problem is ill-conditioned.
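To make the composite setting concrete, the sketch below solves an elastic-net least-squares problem of the form described in the abstract with a plain proximal gradient iteration. This is an illustration of the problem class only, not the GeoPG algorithm; the matrix `A`, vector `b`, regularization weights, and iteration count are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1 (coordinatewise soft-thresholding).
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_grad_elastic_net(A, b, lam1=0.1, lam2=0.1, iters=500):
    # Minimize 0.5*||Ax - b||^2 + (lam2/2)*||x||^2 + lam1*||x||_1.
    # The smooth part f(x) = 0.5*||Ax - b||^2 + (lam2/2)*||x||^2 is strongly
    # convex (for lam2 > 0); the nonsmooth part lam1*||x||_1 enters only
    # through its proximal operator, as in the composite model of the paper.
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2 + lam2  # Lipschitz constant of grad f
    x = np.zeros(n)
    for _ in range(iters):
        grad = A.T @ (A @ x - b) + lam2 * x
        x = soft_threshold(x - grad / L, lam1 / L)
    return x
```

Here the condition number is $\kappa = L/\mu$ with $\mu = \lambda_2 + \sigma_{\min}(A)^2$; this unaccelerated baseline contracts at roughly $(1-1/\kappa)$ per iteration, whereas GeoPG and Nesterov's accelerated proximal gradient method achieve the optimal $(1-1/\sqrt{\kappa})$ rate.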
Reviews: Geometric Descent Method for Convex Composite Minimization
Summary: The paper extends the recent work of Bubeck et al. on the geometric descent method to handle non-smooth (but "nice") objectives. Similar work can be found in [10]; however, no optimal rates are obtained there (in the sense of a square-root dependence on the condition number in the contraction factor). The core ideas and motivation originate from the proximity operators used in non-smooth optimization in machine learning and signal processing (see, e.g., the lasso). Quality: The paper is of good technical quality (for clarity, see below). The results could be considered "expected," since similar extensions (from smooth to non-smooth convex objectives via proximal operators) have been established in numerous papers over the past decade, especially under convexity assumptions.
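For reference, the proximity operator the review alludes to is defined, for a convex function $h$ and step size $t>0$, as the minimizer of $h$ plus a quadratic penalty anchored at the query point; for the lasso penalty it reduces to soft-thresholding:

```latex
\operatorname{prox}_{t h}(x) \;=\; \arg\min_{u}\Big\{\, h(u) + \tfrac{1}{2t}\|u - x\|^2 \Big\},
\qquad
\big[\operatorname{prox}_{t\lambda\|\cdot\|_1}(x)\big]_i \;=\; \operatorname{sign}(x_i)\,\max\big(|x_i| - t\lambda,\; 0\big).
```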
Papers published at the Neural Information Processing Systems Conference.